A Non-convex Approach for Sparse Recovery with Convergence Guarantee

Authors

  • Laming Chen
  • Yuantao Gu
Abstract

In the area of sparse recovery, numerous studies suggest that non-convex penalties induce better sparsity than convex ones, yet non-convex algorithms have so far lacked a convergence guarantee from the initial solution to the global optimum. This paper provides a theoretical guarantee for sparse recovery via non-convex optimization. The concept of weak convexity is incorporated into a class of sparsity-inducing penalties to characterize their non-convexity. It is shown that, in a neighborhood of the sparse signal (with radius inversely proportional to the non-convexity), any local optimum can be regarded as a stable solution. It is further proved that if the non-convexity of the penalty function is below a threshold, the initial solution also belongs to this neighborhood. In addition, the idea of the projected (sub)gradient method is generalized to solve this non-convex optimization problem. A uniform approximate projection can be applied in the projection step to make the algorithm computationally tractable for large-scale problems. A theoretical convergence analysis of these methods is provided for the noisy scenario. The results reveal that if the non-convexity is below a threshold, the methods converge from the initial solution, and the recovery error of the obtained solution is linear in both the noise term and the step size. Numerical simulations are performed to test the performance of the proposed approach and verify the theoretical analysis.
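To make the generalized projected (sub)gradient idea concrete, below is a minimal sketch. It assumes the minimax concave penalty (MCP) as one example of a weakly convex sparsity-inducing penalty and, for simplicity, the noiseless equality constraint Ax = y, whose projection has a closed form through the pseudoinverse; the abstract itself does not fix either choice, nor the uniform approximate projection used for large-scale problems.

```python
import numpy as np

def mcp_subgrad(x, alpha):
    """Subgradient of the minimax concave penalty (MCP), an alpha-weakly convex
    sparsity-inducing penalty: f(t) = |t| - alpha*t^2/2 for |t| <= 1/alpha,
    and f(t) = 1/(2*alpha) otherwise."""
    g = np.sign(x) - alpha * x          # inner region: d/dt(|t| - alpha*t^2/2)
    g[np.abs(x) > 1.0 / alpha] = 0.0    # flat outer region: zero subgradient
    return g                            # at t = 0, sign(0) = 0 is a valid subgradient

def projected_subgradient(A, y, alpha=0.5, step=1e-2, iters=500):
    """Projected subgradient sketch: descend on the weakly convex penalty
    while staying on {x : Ax = y} (noiseless constraint assumed here)."""
    pinv = np.linalg.pinv(A)            # used for projection onto the affine set
    x = pinv @ y                        # least-squares initial solution
    for _ in range(iters):
        x = x - step * mcp_subgrad(x, alpha)   # subgradient step on the penalty
        x = x - pinv @ (A @ x - y)             # project back onto {x : Ax = y}
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 100, 40, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    x_hat = projected_subgradient(A, A @ x_true)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With a small non-convexity parameter alpha, the iteration behaves much like projected subgradient descent on an l1-type penalty while reducing the bias on large coefficients, which is the trade-off the weak-convexity analysis quantifies.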


Related articles

Efficient Sparse Recovery via Adaptive Non-Convex Regularizers with Oracle Property

The main shortcoming of sparse recovery with a convex regularizer is that it is a biased estimator and therefore results in suboptimal performance in many cases. Recent studies have shown, both theoretically and empirically, that a non-convex regularizer is able to overcome the biased estimation problem. Although multiple algorithms have been developed for sparse recovery with non-convex re...

Full text

Optimal energy management of the photovoltaic based distribution networks considering price responsive loads, energy storage systems and convex power flows.

Nowadays, the presence of photovoltaic systems in distribution networks is not without challenges, and it may not be economically productive for the system in the absence of optimal management. Energy storage systems are able to cope with this problem. Therefore, in this paper, a new method is proposed for the energy management of distribution networks in order to show how the presence of the energy ...

Full text

A Nonconvex Free Lunch for Low-Rank plus Sparse Matrix Recovery

We study the problem of low-rank plus sparse matrix recovery. We propose a generic and efficient nonconvex optimization algorithm based on projected gradient descent and a double thresholding operator, with much lower computational complexity. Compared with existing convex-relaxation based methods, the proposed algorithm recovers the low-rank plus sparse matrices for free, without incurring any a...

Full text
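The excerpt above mentions projected gradient descent with a double thresholding operator. The sketch below only illustrates the generic double-thresholding pattern under the assumption that the matrix M = L + S is observed directly (robust-PCA style) with a known target rank r and entrywise threshold theta; it is not the specific algorithm or complexity analysis of the cited paper.

```python
import numpy as np

def svd_rank_r(M, r):
    """Best rank-r approximation of M via singular value truncation."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def hard_threshold(M, theta):
    """Zero out entries of M with magnitude below theta."""
    return M * (np.abs(M) >= theta)

def low_rank_plus_sparse(M, r, theta, iters=50):
    """Alternating double thresholding for M = L + S with rank(L) <= r and
    S entrywise sparse (a generic sketch, not the cited algorithm)."""
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svd_rank_r(M - S, r)          # low-rank update via truncated SVD
        S = hard_threshold(M - L, theta)  # sparse update via hard thresholding
    return L, S
```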

An Intelligent Approach Based on Meta-Heuristic Algorithm for Non-Convex Economic Dispatch

One of the significant strategies in power systems is the Economic Dispatch (ED) problem, which is defined as the optimal generation of power units to produce energy at the lowest cost while fulfilling the demand within several limits. The undeniable impacts of ramp rate limits, valve loading, prohibited operating zones, spinning reserve and the multi-fuel option on the economic dispatch of practical p...

Full text

Convex Optimization without Projection Steps

We study the general problem of minimizing a convex function over a compact convex domain. We investigate a simple iterative approximation algorithm based on the method of Frank & Wolfe [FW56] that does not need projection steps in order to stay inside the optimization domain. Instead of a projection step, the linearized problem defined by a current subgradient is solved, which gives a st...

Full text
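The projection-free step described above, solving the linearized problem instead of projecting, can be sketched as follows. The domain is assumed here to be an l1-ball of radius tau and the objective in the usage example a least-squares loss (illustrative choices; the cited work treats general compact convex domains), so the linear subproblem is minimized at a signed vertex of the ball.

```python
import numpy as np

def frank_wolfe(grad_f, x0, tau, iters=200):
    """Frank-Wolfe / conditional gradient sketch over the l1-ball of radius tau.
    grad_f: function returning the gradient (or a subgradient) at x."""
    x = x0.copy()
    for k in range(iters):
        g = grad_f(x)
        # Linear minimization step: argmin over ||s||_1 <= tau of <g, s>
        # is attained at a signed vertex of the l1-ball.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -tau * np.sign(g[i])
        gamma = 2.0 / (k + 2.0)            # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays in the domain
    return x

# Usage example: f(x) = 0.5*||Ax - y||^2 over the l1-ball of radius 2.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((30, 60))
    y = A @ (np.eye(60)[0] * 2.0)
    x_hat = frank_wolfe(lambda x: A.T @ (A @ x - y), np.zeros(60), tau=2.0)
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - y) ** 2)
```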



Publication date: 2013